53 research outputs found

    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single-neuron responses and on the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites; this approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model of associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
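
    A minimal sketch of the kind of model described here (our own illustration, not the paper's code): a Hopfield-type associative memory in which each neuron groups its synapses onto dendritic branches, and each branch sum receives a supralinear boost before the soma combines the branches. The specific nonlinearity and all parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P, B = 200, 10, 5                    # neurons, stored patterns, branches per neuron

    patterns = rng.choice([-1, 1], size=(P, N))
    W = patterns.T @ patterns / N           # Hebbian couplings
    np.fill_diagonal(W, 0.0)
    branch_of = rng.integers(0, B, size=(N, N))   # random synapse-to-branch assignment

    def somatic_drive(x, i):
        """Sum of supralinearly transformed branch inputs of neuron i."""
        total = 0.0
        for b in range(B):
            on_b = branch_of[i] == b
            s = W[i, on_b] @ x[on_b]                     # linear sum within branch b
            total += s + (s > 0) * 0.5 * np.tanh(4 * s)  # supralinear boost (illustrative)
        return total

    x = patterns[0] * np.where(rng.random(N) < 0.15, -1, 1)   # noisy cue of pattern 0
    for _ in range(5):                                        # asynchronous update sweeps
        for i in rng.permutation(N):
            x[i] = 1 if somatic_drive(x, i) >= 0 else -1
    print("overlap with stored pattern:", patterns[0] @ x / N)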

    Smooth Exact Gradient Descent Learning in Spiking Neural Networks

    Artificial neural networks are trained with great success using backpropagation. For spiking neural networks, however, a similar gradient descent scheme seems prohibitive due to the sudden, disruptive (dis-)appearance of spikes. Here, we demonstrate exact gradient descent learning based on spiking dynamics that change only continuously. These are generated by neuron models whose spikes vanish and appear at the end of a trial, where they no longer influence other neurons. This also enables gradient-based spike addition and removal. We apply our learning scheme to induce and continuously move spikes to desired times, in single neurons and recurrent networks. Further, it achieves competitive performance in a benchmark task using deep, initially silent networks. Our results show how non-disruptive learning is possible despite discrete spikes.
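
    The paper's neuron models are more elaborate, but the core idea that spike times can depend smoothly on parameters can be illustrated with a toy example: for a non-leaky integrate-and-fire neuron with constant input current I = w·x, the spike time t* = theta/I is an analytic function of the weights, so an exact gradient can move the spike to a desired time. All names and values below are ours.

    import numpy as np

    theta = 1.0                       # spike threshold
    x = np.array([0.8, 0.3, 0.5])     # presynaptic inputs (assumed values)
    w = np.array([0.4, 0.2, 0.1])     # initial weights
    t_target, eta = 1.5, 0.05         # desired spike time, learning rate

    for step in range(200):
        I = w @ x                     # constant input current
        t_spike = theta / I           # analytic spike time of the non-leaky IF neuron
        # loss L = (t* - t_target)^2 / 2, with dt*/dw_i = -theta * x_i / I^2
        w -= eta * (t_spike - t_target) * (-theta * x / I**2)

    print("final spike time:", theta / (w @ x))   # converges to ~1.5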

    Stable Irregular Dynamics in Complex Neural Networks

    For infinitely large sparse networks of spiking neurons, mean-field theory shows that a balanced state of highly irregular activity arises under various conditions. Here we analytically investigate the microscopic irregular dynamics in finite networks of arbitrary connectivity, keeping track of all individual spike times. For delayed, purely inhibitory interactions we demonstrate that the irregular dynamics is not chaotic but rather stable and convergent towards periodic orbits. Moreover, every generic periodic orbit of these dynamical systems is stable. These results highlight that chaotic and stable dynamics are equally capable of generating irregular activity.
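
    As a numerical illustration of the stability claim (ours, not the paper's event-based analysis), one can simulate two copies of a sparse, delayed, purely inhibitory leaky integrate-and-fire network that differ only by a tiny perturbation of the initial membrane potentials and check that the trajectories converge rather than diverge. The grid-based integration and all parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    N, K = 100, 10                          # neurons, inhibitory inputs per neuron
    dt, delay_steps = 0.01, 5               # integration step, synaptic delay in steps
    J, I_ext, V_th = -0.5 / K, 1.2, 1.0

    A = np.zeros((N, N))
    for i in range(N):
        A[i, rng.choice(N, K, replace=False)] = J   # sparse inhibitory couplings

    def run(V0, steps=20000):
        V = V0.copy()
        buf = np.zeros((delay_steps, N))    # ring buffer of in-flight spikes
        traj = np.empty((steps, N))
        for t in range(steps):
            V += dt * (-V + I_ext) + A @ buf[t % delay_steps]  # leak + drive + delayed pulses
            spikes = V >= V_th
            V[spikes] = 0.0                 # reset after spiking
            buf[t % delay_steps] = spikes   # these pulses arrive delay_steps later
            traj[t] = V
        return traj

    V0 = rng.uniform(0, 1, N)
    d = np.linalg.norm(run(V0) - run(V0 + 1e-6 * rng.standard_normal(N)), axis=1)
    print("distance early vs late:", d[100], d[-1])   # shrinking distance indicates stability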

    Growing Critical: Self-Organized Criticality in a Developing Neural System

    Experiments in various neural systems found avalanches: bursts of activity with characteristics typical of critical dynamics. A possible explanation for their occurrence is an underlying network that self-organizes into a critical state. We propose a simple spiking model for developing neural networks, showing how these may "grow into" criticality. Avalanches generated by our model correspond to clusters of the widely applied Hawkes processes. We analytically derive the cluster size and duration distributions and find that they agree with those of experimentally observed neuronal avalanches.
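
    The correspondence to Hawkes processes suggests a simple way to reproduce the avalanche statistics numerically: clusters of a Hawkes process can be sampled as Poisson branching processes, and near the critical branching ratio m = 1 the cluster-size distribution develops the heavy tail typical of neuronal avalanches. A minimal sketch under these assumptions (values are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)

    def cluster_size(m):
        """Total size of one cluster: each event spawns Poisson(m) offspring."""
        size, active = 1, 1
        while active and size < 10**6:          # cap guards against runaway clusters
            children = rng.poisson(m, size=active).sum()
            size += children
            active = children
        return size

    sizes = np.array([cluster_size(m=0.99) for _ in range(20000)])
    # Near criticality, P(size = s) ~ s^(-3/2); inspect the empirical tail:
    for s in (1, 10, 100):
        print(s, np.mean(sizes >= s))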

    Learning Universal Computations with Spikes

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require the prior building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. First, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows learning even difficult benchmark tasks, such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.
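
    As a generic stand-in for the output-learning step (a reservoir-computing-style scheme of our own, not the paper's specific rule): drive a random spiking network, low-pass filter its spike trains into traces r(t), and fit a linear readout by ridge regression so that w·r(t) approximates a target signal. All parameters are assumptions.

    import numpy as np

    rng = np.random.default_rng(3)
    N, T_steps, dt = 200, 4000, 1e-3
    tau_m, tau_s = 0.02, 0.05                        # membrane and filter time constants
    W = 0.03 * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weights
    I_ext = 1.5 + 0.3 * rng.standard_normal(N)       # heterogeneous constant drive

    V, r = rng.uniform(0, 1, N), np.zeros(N)
    R = np.empty((T_steps, N))                       # filtered spike trains over time
    for t in range(T_steps):
        V += dt / tau_m * (-V + I_ext)               # leaky integration
        spikes = V >= 1.0
        V[spikes] = 0.0                              # reset
        V += W @ spikes                              # instantaneous recurrent kicks
        r += dt / tau_s * (-r) + spikes              # exponential synaptic filter
        R[t] = r

    y = np.sin(2 * np.pi * 3 * np.arange(T_steps) * dt)           # target signal
    w_out = np.linalg.solve(R.T @ R + 1e-3 * np.eye(N), R.T @ y)  # ridge regression
    print("readout RMSE:", np.sqrt(np.mean((R @ w_out - y) ** 2)))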

    How Chaotic is the Balanced State?

    Large sparse circuits of spiking neurons exhibit a balanced state of highly irregular activity under a wide range of conditions. It occurs likewise in sparsely connected random networks that receive excitatory external inputs and recurrent inhibition, as well as in networks with mixed recurrent inhibition and excitation. Here we analytically investigate this irregular dynamics in finite networks, keeping track of all individual spike times and the identities of individual neurons. For delayed, purely inhibitory interactions we show that the irregular dynamics is not chaotic but stable. Moreover, we demonstrate that after long transients the dynamics converges towards periodic orbits and that every generic periodic orbit of these dynamical systems is stable. We investigate the collective irregular dynamics upon increasing the time scale of synaptic responses and upon iteratively replacing inhibitory by excitatory interactions. Whereas for small and moderate time scales, as well as for few excitatory interactions, the dynamics stays stable, there is a smooth transition to chaos if the synaptic response becomes sufficiently slow (even in purely inhibitory networks) or the number of excitatory interactions becomes too large. These results indicate that chaotic and stable dynamics are equally capable of generating irregular neuronal activity. More generally, chaos apparently is not essential for generating the high irregularity of balanced activity, and we suggest that a mechanism different from chaos and stochasticity significantly contributes to irregular activity in cortical circuits.
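
    The reported transition to chaos can be probed numerically in the same toy setting as the convergence sketch above (again our illustration, not the paper's analysis): flip a fraction f_exc of the inhibitory couplings in a delayed leaky integrate-and-fire network to excitatory and estimate the rate at which two nearby trajectories separate or converge; a positive rate suggests chaos. Parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(4)
    N, K, dt, delay_steps = 100, 10, 0.01, 5
    I_ext, V_th = 1.2, 1.0

    def divergence_rate(f_exc, steps=8000):
        A = np.zeros((N, N))
        for i in range(N):
            idx = rng.choice(N, K, replace=False)
            sign = np.where(rng.random(K) < f_exc, 1.0, -1.0)   # some couplings excitatory
            A[i, idx] = 0.5 / K * sign
        def run(V0):
            V, buf = V0.copy(), np.zeros((delay_steps, N))
            traj = np.empty((steps, N))
            for t in range(steps):
                V += dt * (-V + I_ext) + A @ buf[t % delay_steps]
                s = V >= V_th
                V[s] = 0.0
                buf[t % delay_steps] = s
                traj[t] = V
            return traj
        V0 = rng.uniform(0, 1, N)
        d = np.linalg.norm(run(V0) - run(V0 + 1e-8), axis=1) + 1e-15
        return (np.log(d[-1]) - np.log(d[100])) / ((steps - 100) * dt)

    for f in (0.0, 0.2, 0.5):
        print(f, divergence_rate(f))    # positive rate suggests chaotic dynamics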

    Designing complex networks

    We suggest a new perspective for research towards understanding the relations between the structure and dynamics of a complex network: can we design a network, e.g. by modifying the features of its units or interactions, such that it exhibits a desired dynamics? Here we present a case study where we positively answer this question analytically for networks of spiking neural oscillators. First, we present a method of finding the set of all networks (defined by all mutual coupling strengths) that exhibit an arbitrary given periodic pattern of spikes as an invariant solution. In such a pattern, all the spike times of all the neurons are exactly predefined. The method is very general, as it covers networks of different types of neurons, excitatory and inhibitory couplings, interaction delays that may be heterogeneously distributed, and arbitrary network connectivities. Second, we show how to design networks if further restrictions are imposed, for instance by predefining the detailed network connectivity. We illustrate the applicability of the method with examples of Erdős–Rényi and power-law random networks. Third, the method can be used to design networks that optimize network properties. To illustrate this idea, we design networks that exhibit a predefined pattern dynamics while at the same time minimizing the networks’ wiring costs.
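
    A toy version of the design idea (heavily simplified relative to the paper, and ignoring the subthreshold conditions that rule out premature threshold crossings): for leaky integrate-and-fire neurons with a common delay that should each fire once per period at predefined times, the threshold condition at each desired spike time is linear in the coupling strengths, so a realizing coupling matrix follows from least squares, which at the same time picks the minimum-norm ("cheapest wiring") solution. All parameters are assumptions.

    import numpy as np

    N, T, d, tau, I0, theta = 5, 1.0, 0.1, 1.0, 1.0, 1.0
    t_spk = np.linspace(0.0, T, N, endpoint=False)   # desired periodic spike times

    A = np.zeros((N, N * N))                         # one threshold equation per neuron
    b = np.zeros(N)
    for i in range(N):
        prev = t_spk[i] - T                          # neuron i's previous spike (reset to 0)
        # free leaky integration from the reset up to the next desired spike:
        b[i] = theta - I0 * tau * (1.0 - np.exp(-T / tau))
        for j in range(N):
            if j == i:
                continue
            # arrival time of j's delta pulse, shifted into the interval (prev, t_spk[i]]
            a_ij = prev + (t_spk[j] + d - prev) % T
            # the pulse decays exponentially until the desired spike time
            A[i, i * N + j] = np.exp(-(t_spk[i] - a_ij) / tau)
    w = np.linalg.lstsq(A, b, rcond=None)[0].reshape(N, N)
    print(np.round(w, 3))                            # minimum-norm couplings for the pattern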